Approximate Maximum A Posteriori Inference with Entropic Priors
Abstract
In certain applications it is useful to fit multinomial distributions to observed data with a penalty term that encourages sparsity. For example, in probabilistic latent audio source decomposition one may wish to encode the assumption that only a few latent sources are active at any given time. The standard heuristic of applying an L1 penalty is not an option when fitting the parameters of a multinomial distribution, which are constrained to sum to 1. An alternative is to use a penalty term that encourages low-entropy solutions, which corresponds to maximum a posteriori (MAP) parameter estimation with an entropic prior. The lack of conjugacy between the entropic prior and the multinomial distribution complicates this approach. In this report I propose a simple iterative algorithm for MAP estimation of multinomial distributions with sparsity-inducing entropic priors.
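To make the penalized objective concrete, the following is a minimal numerical sketch and not the algorithm proposed in the report: MAP estimation of multinomial parameters theta under an entropic prior proportional to exp(beta * sum_i theta_i * log theta_i), where beta > 0 rewards low-entropy (sparse) solutions, optimized here with generic exponentiated-gradient updates on the probability simplex. The function name entropic_map_multinomial, the sign convention for beta, and the step-size schedule below are illustrative assumptions.

import numpy as np

def entropic_map_multinomial(counts, beta, n_iter=5000, step=0.1, eps=1e-12):
    """Sketch: MAP estimate of multinomial parameters under an entropic prior.

    Maximizes (up to constants)
        J(theta) = sum_i counts_i * log(theta_i) + beta * sum_i theta_i * log(theta_i)
    over the probability simplex. The constraint is handled with exponentiated-gradient
    (mirror-ascent) updates; this is a generic local optimizer, not the report's iteration.
    """
    counts = np.asarray(counts, dtype=float)
    theta = (counts + 1.0) / (counts + 1.0).sum()        # smoothed ML starting point
    for _ in range(n_iter):
        grad = counts / (theta + eps) + beta * (np.log(theta + eps) + 1.0)
        grad = grad - grad.max()                         # constant shift cancels after renormalization
        theta = theta * np.exp(step * grad / (np.abs(grad).max() + eps))
        theta = theta / theta.sum()
    return theta

# With beta = 0 this approximately recovers the maximum-likelihood estimate
# counts / counts.sum(); larger beta concentrates mass on the dominant components.
print(entropic_map_multinomial([5, 3, 1, 1], beta=0.0))
print(entropic_map_multinomial([5, 3, 1, 1], beta=20.0))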
Similar Resources
Consistency of Sequence Classification with Entropic Priors
Entropic priors, recently revisited within the context of theoretical physics, were originally introduced for image processing and for general statistical inference. Entropic priors seem to represent a very promising approach to “objective” prior determination when such information is not available. Attention has mostly been limited to continuous parameter spaces, and our focus in this work ...
Bayesian Inference Featuring Entropic Priors
The subject of this work is the parametric inference problem, i.e. how to infer, from data, the parameters of the data likelihood of a random process whose parametric form is known a priori. The assumption that Bayes’ theorem has to be used to add new data samples reduces the problem to the question of how to specify a prior before having seen any data. For this subproblem three theorems are s...
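As a small illustration of this reduction (a standard Dirichlet-multinomial conjugate example, not taken from the cited paper): once the prior pseudo-counts are fixed, absorbing new data batches via Bayes' theorem is mechanical, so the only genuine modelling choice left is the prior itself. The prior alpha0 and the count vectors below are hypothetical.

import numpy as np

alpha0 = np.ones(4)                   # hypothetical Dirichlet prior pseudo-counts
batches = [np.array([3, 0, 1, 0]),    # hypothetical multinomial count vectors
           np.array([2, 1, 0, 0])]

alpha = alpha0.copy()
for counts in batches:
    alpha = alpha + counts            # posterior after each batch becomes the next prior
posterior_mean = alpha / alpha.sum()  # Dirichlet posterior mean after all batches
print(posterior_mean)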
Simultaneous Multi-frame MAP Super-Resolution Video Enhancement using Spatio-temporal Priors
A simultaneous multi-frame super-resolution video reconstruction procedure, utilizing spatio-temporal smoothness constraints and motion estimator confidence parameters, is proposed. The ill-posed inverse problem of reconstructing super-resolved imagery from the low resolution, degraded observations is formulated as a statistical inference problem and a Bayesian, maximum a-posteriori (MAP) approa...
Entropic Priors for Discrete Probabilistic Networks and for Mixtures of Gaussians Models
The ongoing, unprecedented exponential explosion of available computing power has radically transformed the methods of statistical inference. What used to be a small minority of statisticians advocating the use of priors and strict adherence to Bayes’ theorem is now becoming the norm across disciplines. The evolutionary direction is now clear. The trend is towards more realistic, flexi...
Maximum Entropy, Fluctuations and Priors
The method of maximum entropy (ME) is extended to address the following problem: Once one accepts that the ME distribution is to be preferred over all others, the question is to what extent distributions with lower entropy are supposed to be ruled out. Two applications are given. The first is to the theory of thermodynamic fluctuations. The formulation is exact, covariant under changes of coord...
Journal: CoRR
Volume: abs/1009.5761
Pages: -
Publication year: 2010